Improved Sparse Low-Rank Matrix Estimation

Authors

  • Ankit Parekh
  • Ivan W. Selesnick
Abstract

We address the problem of estimating a sparse low-rank matrix from its noisy observation. We propose an objective function consisting of a data-fidelity term and two parameterized non-convex penalty functions. Further, we show how to set the parameters of the non-convex penalty functions, in order to ensure that the objective function is strictly convex. The proposed objective function better estimates sparse low-rank matrices than a convex method which utilizes the sum of the nuclear norm and the l1 norm. We derive an algorithm (as an instance of ADMM) to solve the proposed problem, and guarantee its convergence provided the scalar augmented Lagrangian parameter is set appropriately. We demonstrate the proposed method for denoising an audio signal and an adjacency matrix representing protein interactions in the ‘Escherichia coli’ bacteria.
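For concreteness, the sketch below shows the convex baseline that the abstract compares against: denoising an observed matrix Y by minimizing (1/2)||Y - X||_F^2 + lam0*||X||_* + lam1*||X||_1 with a standard ADMM splitting. This is an illustrative NumPy sketch, not the authors' code; the names lam0, lam1, and rho are assumptions, and the proposed non-convex method would replace the two soft-threshold rules below with its parameterized non-convex threshold functions.

```python
# Minimal sketch of the convex baseline (nuclear norm + l1 denoising) via ADMM.
# Not the authors' implementation; parameter names are illustrative.
import numpy as np

def soft(v, t):
    """Entry-wise soft threshold: prox of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def svt(M, t):
    """Singular value thresholding: prox of t * ||.||_* (nuclear norm)."""
    P, s, Qt = np.linalg.svd(M, full_matrices=False)
    return (P * soft(s, t)) @ Qt

def sparse_low_rank_denoise(Y, lam0, lam1, rho=1.0, n_iter=100):
    """Minimize 0.5*||Y - X||_F^2 + lam0*||X||_* + lam1*||X||_1 by ADMM."""
    X = np.zeros_like(Y)
    Z = np.zeros_like(Y)   # copy of X carrying the l1 penalty
    U = np.zeros_like(Y)   # scaled dual variable
    for _ in range(n_iter):
        # X-update: data fidelity + nuclear norm -> singular value thresholding
        X = svt((Y + rho * (Z - U)) / (1.0 + rho), lam0 / (1.0 + rho))
        # Z-update: l1 penalty -> entry-wise soft threshold
        Z = soft(X + U, lam1 / rho)
        # dual update enforcing the consensus constraint X = Z
        U = U + X - Z
    return Z
```

Returning Z rather than X yields an iterate with exact zeros from the soft threshold. The convergence condition mentioned in the abstract concerns how the scalar augmented Lagrangian parameter (rho here) must be chosen when the non-convex penalties are used in place of these convex prox steps.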


Related articles

Cramér-Rao Bound for Sparse Signals Fitting the Low-Rank Model with Small Number of Parameters

In this letter, we consider signals with a low-rank covariance matrix which reside in a low-dimensional subspace and can be written in terms of a finite (small) number of parameters. Although such signals do not necessarily have a sparse representation in a finite basis, they possess a sparse structure which makes it possible to recover the signal from compressed measurements. We study the stat...

Speaker adaptation based on sparse and low-rank eigenphone matrix estimation

Eigenphone-based speaker adaptation outperforms the conventional MLLR and eigenvoice methods when the adaptation data is sufficient, but it suffers from severe over-fitting when the adaptation data is limited. In this paper, l1 and nuclear norm regularization are applied simultaneously to obtain a more robust eigenphone estimation, resulting in a sparse and low-rank eigenphone matrix. The s...

Near-Optimal Estimation of Simultaneously Sparse and Low-Rank Matrices from Nested Linear Measurements

In this paper, we consider the problem of estimating simultaneously low-rank and row-wise sparse matrices from nested linear measurements, where the linear operator consists of the product of a linear operator W and a matrix Ψ. Leveraging the nested structure of the measurement operator, we propose a computationally efficient two-stage algorithm for estimating the simultaneously structured target...

Improved Deterministic Conditions for Sparse and Low-Rank Matrix Decomposition

In this paper, the problem of splitting a given matrix into sparse and low-rank matrices is investigated. The question is when and how this decomposition can be performed exactly. This problem is ill-posed in general, and we need to impose some (sufficient) conditions to be able to decompose a matrix into sparse and low-rank matrices. These conditions can be categorized into two general classes: (a) dete...
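To make the decomposition concrete, the sketch below shows the standard convex relaxation for this splitting, principal component pursuit: minimize ||L||_* + lam*||S||_1 subject to L + S = M, solved with a basic augmented-Lagrangian iteration. This is an assumed illustration rather than this paper's deterministic analysis; it reuses the soft() and svt() helpers from the earlier sketch, and the default weight lam = 1/sqrt(max(m, n)) is the usual choice, not something taken from this paper.

```python
# Sketch of sparse + low-rank splitting via the convex relaxation (principal
# component pursuit); reuses soft() and svt() defined in the sketch above.
def split_sparse_low_rank(M, lam=None, mu=1.0, n_iter=200):
    """Split M into a low-rank part L and a sparse part S with L + S ~= M."""
    if lam is None:
        lam = 1.0 / np.sqrt(max(M.shape))  # common default weight (assumed)
    L = np.zeros_like(M)
    S = np.zeros_like(M)
    Y = np.zeros_like(M)                   # dual variable for L + S = M
    for _ in range(n_iter):
        L = svt(M - S + Y / mu, 1.0 / mu)   # nuclear-norm prox step
        S = soft(M - L + Y / mu, lam / mu)  # l1 prox step
        Y = Y + mu * (M - L - S)            # dual ascent on the constraint
    return L, S
```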

Fundamental limits of symmetric low-rank matrix estimation

We consider the high-dimensional inference problem where the signal is a low-rank symmetric matrix corrupted by additive Gaussian noise. Given a probabilistic model for the low-rank matrix, we compute the limit, in the large-dimension setting, of the mutual information between the signal and the observations, as well as the matrix minimum mean square error, while the rank of the sign...


Journal:
  • Signal Processing

Volume 139, Issue –

Pages –

Publication date: 2017